
Endeavour Starfleet
Center for Advanced Studies Gallente Federation
57
|
Posted - 2011.12.25 22:56:00 -
[1] - Quote
I run two accounts on a single 20-inch display (16:9). I simply change them to minimum 4:3 size and run two on the same display with no issues.
However, be aware: EVE likes to eat a GREAT deal of RAM these days. I suggest two gigs of memory per client at full settings. Reducing settings greatly may help with this.
Video memory doesn't seem too bad, but if you get into any kind of situation where there is a lot going on on both screens, it will choke without a good gigabyte or so of VRAM. |
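As a rough sketch of the sizing arithmetic above: the 2 GB-per-client and OS-overhead figures are this post's own estimates, not official requirements, and the function name and defaults are made up for illustration.

```python
def required_ram_gb(clients, ram_per_client_gb=2.0, os_overhead_gb=2.0):
    """Rough system-RAM estimate for running several EVE clients at once.

    The 2 GB-per-client default is the forum poster's estimate at full
    settings; lower it if you turn the graphics settings down.
    """
    return clients * ram_per_client_gb + os_overhead_gb

# Two clients at full settings plus OS overhead:
print(required_ram_gb(2))  # 6.0
```

The same back-of-the-envelope math applies to VRAM: two busy clients at the post's 1 GB-each figure will choke a 512 MB card.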

Endeavour Starfleet
Center for Advanced Studies Gallente Federation
57
|
Posted - 2011.12.25 23:04:00 -
[2] - Quote
Roh Voleto wrote:First of all: You want two monitors anyway. After only a few days you'll start asking yourself how you made do with just one.
As far as EVE is concerned: I simply play in windowed mode. This way you can switch between clients just like any other application.
Lack of space on my desk. Also, I like that my single display only uses 20 watts. My old display used closer to 100, and the electric bill was starting to reflect this.
Now I have gotten used to this setup, and its advantages for me outweigh running two. However, yes, two is better in the classic sense.
For my next PC build, power efficiency will be king, so I can recover enough to run two displays without guilt. |

Endeavour Starfleet
Center for Advanced Studies Gallente Federation
57
|
Posted - 2011.12.25 23:11:00 -
[3] - Quote
I don't know much about SLI, but would it not be better to exclude EVE from whatever profile SLI uses and run one GPU per client? |

Endeavour Starfleet
Center for Advanced Studies Gallente Federation
57
|
Posted - 2011.12.25 23:16:00 -
[4] - Quote
Darren Corley wrote:Endeavour Starfleet wrote:I don't know much about SLI but would it not be better to unlatch EVE in whatever profile SLI uses and use one GPU per client? Theoretically, but it may not quite work as expected. You would need the second monitor connected to the second GPU, and select said second GPU in the second client. I've done this under Wine in Linux, but not in Windows. I just use 2 computers now.
I get it now! (Frak! Should have had a V8 *forehead slap*)
If both are connected to the primary GPU, I can see why doing this under SLI is far better.
Does it scale properly, though?
Edit: BTW, 2 computers? I would hate to see your power bill if you don't have a solar collector or windmill outside your house. |

Endeavour Starfleet
Center for Advanced Studies Gallente Federation
57
|
Posted - 2011.12.25 23:35:00 -
[5] - Quote
JonnyRandom wrote:CCP Stillman wrote:Personally I will run up to 3 monitors across 3 screens(Powered by 2x HD 5770s).
You have three monitors on your desk? How do they fit? Do you have a photo?
If electrical power here were 3x cheaper I would run 3 displays myself. Likely one in the center and one on each side slanted towards you, as that enhances immersion. |

Endeavour Starfleet
Center for Advanced Studies Gallente Federation
57
|
Posted - 2011.12.25 23:42:00 -
[6] - Quote
It is not that low, but I will concede that I am far too picky about that. The recession knocked us back quite a bit, and I have learned to be as power conscious as possible.
Yet you must also factor in heat production. While it is winter there is no difference, but in the south, with hot summers, the extra wattage ends up as heat. Because my new display is LED rather than tube-backlit, the money spent cooling away the old one's heat would cost more than its power draw in the first place. |

Endeavour Starfleet
Center for Advanced Studies Gallente Federation
57
|
Posted - 2011.12.25 23:46:00 -
[7] - Quote
Darren Corley wrote:JonnyRandom wrote:Darren Corley wrote:Yes it can be done, and if you're using SLI it's actually easier than expected. You just set the second client to the second GPU and it will put itself on the second monitor. It won't use only the second GPU; both will render both. It's just that in game you can use fixed-window mode and they will go to the appropriate screens.
Now, it is far better to use 2 computers and a program like Synergy or Input Director. Just so much better, and no need to click back to each client to make it active and accept commands again. Synergy? Input Director? What do those do? They're both keyboard/mouse switching software. You set up one computer as the master, and any others as clients. Only the master needs a keyboard and mouse. Those two programs (there are others too) basically let you move the mouse offscreen to another computer: like multiple monitors, but with multiple computers. I normally run this character on my OP desktop on a 30" Dell, my alt account runs @ 1920x1080 on my equally OP gaming desknote, and I use Synergy or Input Director to control both with one keyboard and mouse.
Windows allows that? Or do you have to trick Windows by having the software create a ghost display? |
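The Synergy/Input Director idea described above can be sketched as forwarding input events from a master machine to a client over a plain TCP socket. This toy version uses a made-up JSON event format, not either program's actual protocol, and only ships the events across rather than injecting real keystrokes:

```python
import json
import socket
import threading

def serve_events(events, host="127.0.0.1", port=0):
    """Master side: send each input event to the first client that connects."""
    srv = socket.socket()
    srv.bind((host, port))
    srv.listen(1)
    port = srv.getsockname()[1]  # OS-assigned port

    def run():
        conn, _ = srv.accept()
        for ev in events:
            # One newline-delimited JSON event per line.
            conn.sendall((json.dumps(ev) + "\n").encode())
        conn.close()
        srv.close()

    threading.Thread(target=run, daemon=True).start()
    return port

def receive_events(port, host="127.0.0.1"):
    """Client side: read events until the master disconnects.

    A real software KVM would inject these into the local input queue.
    """
    sock = socket.create_connection((host, port))
    events = [json.loads(line) for line in sock.makefile()]
    sock.close()
    return events

port = serve_events([{"type": "key", "code": "F1"}, {"type": "move", "dx": 5}])
print(receive_events(port))
```

No ghost display is involved: each machine keeps its own screens, and only the input stream crosses the network.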

Endeavour Starfleet
Center for Advanced Studies Gallente Federation
57
|
Posted - 2011.12.26 00:11:00 -
[8] - Quote
I thought it was that the clients need to be similar for best performance. A different display size is just annoying when you move your mouse back and forth. |

Endeavour Starfleet
Center for Advanced Studies Gallente Federation
57
|
Posted - 2011.12.26 06:12:00 -
[9] - Quote
You don't need 16GB; 8 is fine for 3 accounts.
The i7 is overkill. That is a very nice turbo, but you can get away with a good high-end i5 and save a little. EVE won't be using the extra oomph for years, and by then the i7 will be obsolete compared to 22nm models.
I run two clients on an ATI 4770 (4850 equivalent); if I run both at full res it will choke some, but not terribly, and that is mainly due to its 512MB of VRAM when 1GB is needed. Running 3 with that monster of a card, a good i5, and 8GB of RAM ought to be easy in my opinion. No need for SLI. |

Endeavour Starfleet
Center for Advanced Studies Gallente Federation
57
|
Posted - 2011.12.26 10:36:00 -
[10] - Quote
CCP Stillman wrote:Darren Corley wrote:Sassums wrote:While the 16GB might be overkill - the cost difference between 8 and 16 is just so small - best to do it right. The board can support up to 32GB of RAM, but it jumps the price from $70-90 (for 16) to over $300 for 32.
I am getting the Intel Core i7-2700k and the Intel DP67BGB3 board for $250.
Another $100 for RAM.
And another $50-75 for CD/DVD drives, since my old ones won't work anymore, and I am set. It's still your money in any case, but if you only really game on it, the i7 will be wasted. And with the RAM, unless it's 2 sticks of 8GB, I'd still go with 2x4. Can you actually get 8GB sticks? I've not been able to track down any non-ECC ones. And I've never found any in Iceland or Denmark.
Maybe ask CCP Atlanta? Or maybe someone can sell you some at Fanfest?
As for RAM: generally it is by far best to get the biggest sticks that meet the minimum requirement for dual or triple channel (2-3 sticks), and leave the rest of the slots open to buy the same sticks later.
16GB, though, is just insanely overkill. Only this year has 4GB become a moderate bottleneck, and that requires you to have your browser and multiple games running at the same time; before that, 1GB had been good for years.
With the cost of game development skyrocketing, I just don't see 8GB being a bottleneck for years. As in, by the time it is, your entire system will be a dinosaur. |
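The stick-buying rule above (one stick per channel, sized so the total reaches your target, leaving the remaining slots open for later) can be sketched as a small helper. The stick sizes and function name here are illustrative, not from any vendor's catalog:

```python
def stick_plan(target_gb, channels=2, stick_sizes_gb=(2, 4, 8)):
    """Pick the smallest stick size that reaches target_gb with exactly
    one stick per memory channel, so the other slots stay free for the
    matching upgrade later. stick_sizes_gb must be ascending.
    """
    for size in stick_sizes_gb:
        if size * channels >= target_gb:
            return channels, size
    raise ValueError("target exceeds the largest available sticks")

print(stick_plan(8))      # (2, 4): 2x4GB in dual channel, 8GB total
print(stick_plan(12, 3))  # (3, 4): 3x4GB in triple channel, 12GB total
```

For the 8GB-on-dual-channel case discussed in the thread, this gives 2x4GB rather than 4x2GB, which is exactly the "leave the rest open" advice.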

Endeavour Starfleet
Center for Advanced Studies Gallente Federation
57
|
Posted - 2011.12.26 10:41:00 -
[11] - Quote
I am not really worried about stressing the memory controller. Most of the time it is built into the CPU and has the same ruggedness as the rest of the system anyway.
It is just that if you go all out and fill every slot off the bat, you leave no room for upgrading later. Some do this so they can upgrade their HTPC with the old memory later on, but not everyone has an HTPC to take the old sticks. |

Endeavour Starfleet
Center for Advanced Studies Gallente Federation
57
|
Posted - 2011.12.26 11:03:00 -
[12] - Quote
Edit: Never mind, I guess that means 4 sticks, not actually quad channel, considering they mention the Phenom II X6.
BTW, what on earth do you need such an expensive mobo for anyway? |

Endeavour Starfleet
Center for Advanced Studies Gallente Federation
57
|
Posted - 2011.12.26 11:08:00 -
[13] - Quote
Sassums wrote:Darren Corley wrote:Sassums wrote:Darren Corley wrote:Endeavour Starfleet wrote:I am not really worried about stressing the memory controller. Most of the time it is built into the CPU and has the same amount of ruggedness the rest of the system has anyway.
It is just that if you went all out with all them filled off the bat you leave no room for upgrading later. Some do this so they can upgrade their HTPC memory with the old later on but not everyone has an HTPC to use the old stuff. It's mainly due to the fact that with all slots filled, it's NOT as rugged as the rest of the CPU. Especially if the system is for games and such, and you like to overclock. Or using high-frequency RAM. Not overclocking, 3.5 is more than enough. This is the memory I am looking at: http://www.newegg.com/Product/Product.aspx?Item=N82E16820231441&Tpk=F3-10666CL9Q-16GBXL Considering there are games out there where even 3.5GHz on Sandy Bridge is not enough... But who knows if you'll ever play them. That, and you don't have to use OC'd RAM to OC SB anyway. Seriously? Well, the CPU can turbo up to like 3.9 I think. All I am playing is EVE - will try out TOR eventually. I have a 360 for everything else :)
TOR hates my GPU (its 512MB of fail), but the CPU seems to be OK. I don't think you will have much to worry about with TOR.
I still think the i7 is overkill, though. If you are serious about spending those funds, perhaps wait for Ivy Bridge with its 22nm 3D transistor process? |

Endeavour Starfleet
Center for Advanced Studies Gallente Federation
57
|
Posted - 2011.12.26 11:12:00 -
[14] - Quote
Darren Corley wrote:Endeavour Starfleet wrote:That memory says it is a quad channel kit.
I don't know much about memory past dual channel, but that looks a bit off, as the board you are looking at says dual channel. I don't really know if that affects anything, but do make sure before hitting buy.
BTW, what on earth do you need such an expensive mobo for anyway? That motherboard is cheap. And quad channel is for the new chipset and extreme chips coming out. Since it's 4 sticks of the same stuff, it will work as dual channel too.
Compared to what? What on it warrants spending that kind of money? |

Endeavour Starfleet
Center for Advanced Studies Gallente Federation
57
|
Posted - 2011.12.26 11:24:00 -
[15] - Quote
Yes, now it is no wonder you found 184 USD to be cheap.
I really, REALLY do not want to see your power bill. It would make baby Al Gore cry...
I know in the real world you have to pay a bit of a premium for Intel hardware compared to AMD. Yet I seriously find it difficult to justify $184 just to gain access to a 2x 16 PCIe slot layout.
Then again, Sassums did say he is getting one hell of a discount. I can't argue with that.
What GPU are you getting, BTW? AMD's 7000 series is about to launch, and it features one hell of a power system that idles at less than 10W. |

Endeavour Starfleet
Center for Advanced Studies Gallente Federation
57
|
Posted - 2011.12.26 11:25:00 -
[16] - Quote
DrStone wrote:also...I did a test and was able to run and tab between 35 EVE clients on 16GB... but again, that's a bit overkill
At the login screen? Or do you seriously have 35 EVE accounts? (And not a bot, for that matter? j/k) |

Endeavour Starfleet
Center for Advanced Studies Gallente Federation
57
|
Posted - 2011.12.26 11:32:00 -
[17] - Quote
Darren Corley wrote:Endeavour Starfleet wrote:Yes now it is no wonder you found 184 USD to be cheap.
I really REALLY do not want to see your power bill. It would make baby al gore cry....
I know in the real world you have to pay a bit of a premium on Intel hardware compared to AMD. Yet I seriously find it difficult to justify 184 just to gain access to a 2x 16 PCIe slot layout.
Then again sassums did say he is getting one hell of a discount. I can't argue with that.
What GPU are you getting btw? AMDs 7x series is about to launch. And it features one hell of a power system that idles at less than 10W The power bill is only bad when I actually push the thing. If I leave it at stock 3.33Ghz with Power Saving crud on, the X5680s and the 2x 5970s don't actually pull a whole lot. The bigger problem is that it has two PSUs, and I need two separate circuits for it, or at least a 15A that it can use exclusively.
So if you plug in a Kill A Watt meter it displays "Yer fraked next bill" or "The heater says day' took er JOBS!!!" Got it!
Two PSUs... Dear Pete....
|

Endeavour Starfleet
Center for Advanced Studies Gallente Federation
57
|
Posted - 2011.12.26 11:40:00 -
[18] - Quote
I am against AFK cloaking, and I don't think that would be viable anyway, as the cost of maintaining such a net of ships would be insane.
If I had a solar roof and bored friends, I would dream of running 10 accounts and running incursions just for me... Epic, EPIC ISK/hr, lol... |

Endeavour Starfleet
Center for Advanced Studies Gallente Federation
57
|
Posted - 2011.12.26 11:44:00 -
[19] - Quote
Sassums, IF you have time to wait, I would seriously wait. Or at least see where the trends go in the early part of 2012 before you buy.
The 22nm stuff is supposed to be killer for power consumption and turbo, because of that new 3D transistor tech.
http://www.slashgear.com/intel-ivy-bridge-desktop-cpu-pricing-leaks-20203598/ If that leak is true, the TDP is awesomely low. |

Endeavour Starfleet
Center for Advanced Studies Gallente Federation
57
|
Posted - 2011.12.26 11:51:00 -
[20] - Quote
DrStone wrote:Endeavour Starfleet wrote:I am against AFK cloaking and I don't think that would be viable anyway as the costs of maintaining such a net of ships would be insane.
If I had a solar roof and bored friends I would dream of running 10 accounts and running incursions just for me... Epic EPIC isk/hr lol... I'm maintaining a healthy 11 accounts..
A bit off topic, but if I may ask: is there really a benefit to running that many? Does it not get to the point where you leave some logged off much of the time because you don't have the will to control them all?
I find two to be hard enough at times, but 11? And how does your computer handle that? |

Endeavour Starfleet
Center for Advanced Studies Gallente Federation
57
|
Posted - 2011.12.26 12:02:00 -
[21] - Quote
Mining, I see. I would have guessed you were doing 10/10s or Sanctums by yourself, but then again that would make managing all those ships an absolute nightmare. |

Endeavour Starfleet
Center for Advanced Studies Gallente Federation
59
|
Posted - 2011.12.27 03:58:00 -
[22] - Quote
"LED TV" has become the standard term, much like "laser TV", which in reality is just another light source for the throw. Yes, there are the extremely large LED billboards and OLED-type tech for phones. Yet for a low-power display, "LED" is just easier for the layman to understand.
And the cost per inch or per pixel is so different between all three that I doubt anyone is going to be seriously confused for long. |

Endeavour Starfleet
Center for Advanced Studies Gallente Federation
59
|
Posted - 2011.12.27 05:01:00 -
[23] - Quote
You must be speaking of early-generation LED-backlit panels. My current screen looks just as good as my power-eating CCFL one, and I don't have to worry about it heating my room.
Yes, I know that is likely a low priority for you. Yet for me and many others it is a great investment, especially in big-screen TVs, where the older tech eats as much power as a CRT. |

Endeavour Starfleet
Center for Advanced Studies Gallente Federation
60
|
Posted - 2011.12.27 08:18:00 -
[24] - Quote
I won't argue with that.
And we have gotten way off topic anyway. Let's steer it back...
Jonny, did you decide what you are going to do to run 2 accounts? |

Endeavour Starfleet
Center for Advanced Studies Gallente Federation
62
|
Posted - 2011.12.27 20:08:00 -
[25] - Quote
You know, on that note, would it be possible in the future for EVE to have its own config file that can be started with different shortcuts?
Nothing fancy, just a shortcut with the argument Client1 or Client2, meaning: create a new config file in the folder if there is not one.
Just a thought. |
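A per-profile settings file of the kind proposed here could be sketched like this. The file names and folder layout are made up for illustration, not how EVE actually stores its settings; a launcher shortcut would pass "Client1" or "Client2" as the profile argument:

```python
import shutil
import tempfile
from pathlib import Path

def config_for(profile, base_dir, template="prefs.ini"):
    """Return a per-profile settings file, copying it from the template
    on first use, so e.g. 'Client1' and 'Client2' keep separate configs.
    """
    base = Path(base_dir)
    cfg = base / f"prefs_{profile}.ini"
    if not cfg.exists():
        shutil.copy(base / template, cfg)
    return cfg

# Demo in a throwaway directory (prefs.ini stands in for the real settings file):
base = Path(tempfile.mkdtemp())
(base / "prefs.ini").write_text("[Graphics]\nfullscreen=0\n")
print(config_for("Client1", base).name)  # prefs_Client1.ini
```

Each shortcut's command line would then map to one profile, and the launcher resolves the matching config file before starting the client.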